Efficient Dropout-Resilient Aggregation for Privacy-Preserving Machine Learning
Authors
Abstract
With the increasing adoption of data-hungry machine learning algorithms, personal data privacy has emerged as one of the key concerns that could hinder the success of digital transformation. As such, Privacy-Preserving Machine Learning (PPML) has received much attention from both academia and industry. However, organizations are faced with the dilemma that, on the one hand, they are encouraged to share data to enhance ML performance, but on the other hand they could potentially be breaching the relevant data privacy regulations. Practical PPML typically allows multiple participants to individually train their ML models, which are then aggregated to construct a global model in a privacy-preserving manner, e.g., based on multi-party computation or homomorphic encryption. Nevertheless, in the most important applications of large-scale PPML, e.g., aggregating clients' gradient updates for federated learning, such as consumer behavior modeling for mobile application services, some participants are inevitably resource-constrained devices that may drop out of the system due to their mobile nature. Therefore, the dropout-resilience of aggregation has become an important problem to be tackled. In this paper, we propose a scalable aggregation scheme that can tolerate dropout at any time and is secure against both semi-honest and active malicious adversaries under proper parameter settings. By replacing communication-intensive building blocks with a seed-homomorphic pseudo-random generator, and relying on the additive property of the Shamir secret sharing scheme, our scheme outperforms state-of-the-art schemes by up to 6.37$\times$ in runtime and provides stronger dropout-resilience. Its simplicity makes it attractive both for implementation and for further improvements.
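The two ingredients named in the abstract can be illustrated with a minimal sketch, assuming a toy prime field, a 2-of-3 threshold, and a placeholder PRG that is not actually seed-homomorphic; it shows the general masking-plus-Shamir idea, not the paper's protocol. Each client masks its update with a PRG output expanded from a short seed and Shamir-shares that seed; because Shamir shares add component-wise, any t surviving share-holders let the server reconstruct the sum of the seeds (and, with a genuinely seed-homomorphic PRG, the aggregate mask) even when some clients drop out.

# Minimal sketch, NOT the paper's protocol: clients mask their updates with
# a PRG expanded from a small seed and Shamir-share the seed; because Shamir
# shares are additive, any t surviving share-holders let the server
# reconstruct the SUM of seeds without learning any individual seed.
import random

P = 2**61 - 1  # illustrative prime modulus for Shamir shares


def prg(seed, length):
    """Toy PRG expanding a seed into a mask vector.
    Placeholder: a real scheme would use a seed-homomorphic PRG."""
    rng = random.Random(seed)
    return [rng.randrange(P) for _ in range(length)]


def shamir_share(secret, n, t):
    """Split `secret` into n shares with reconstruction threshold t over GF(P)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]

    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

    return [(x, poly(x)) for x in range(1, n + 1)]


def shamir_reconstruct(shares):
    """Lagrange interpolation at x = 0."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret


# --- toy run: 3 clients, threshold 2, 4-entry "gradients" -------------------
n, t, dim = 3, 2, 4
gradients = [[random.randrange(1000) for _ in range(dim)] for _ in range(n)]
seeds = [random.randrange(P) for _ in range(n)]

# Each client uploads its PRG-masked update and Shamir-shares its seed.
masked = [[(g + m) % P for g, m in zip(grad, prg(s, dim))]
          for grad, s in zip(gradients, seeds)]
seed_shares = [shamir_share(s, n, t) for s in seeds]

# Additivity: summing each holder's shares gives shares of the SUM of seeds,
# so any t of them reconstruct sum(seeds) while individual seeds stay hidden.
summed = [(x, sum(seed_shares[c][x - 1][1] for c in range(n)) % P)
          for x in range(1, n + 1)]
assert shamir_reconstruct(summed[:t]) == sum(seeds) % P

# With a genuinely seed-homomorphic PRG, prg(sum(seeds)) would (approximately)
# equal the element-wise sum of the individual masks, so the server could
# subtract it from the summed masked updates to recover the aggregate
# gradient; the toy PRG above lacks that property, so only the
# Shamir-additivity step is checked here.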
Similar resources
Energy Efficient Secure & Privacy Preserving Data Aggregation for WSNs
The aim of this research work is to extend wireless sensor network lifetime by reducing communication overhead. Sensor nodes have limited resources, especially energy, which is difficult or impossible to replace. As communication is by far the most energy-consuming aspect of WSNs, one of the main ways to save energy is therefore to reduce communication overhead. Data aggregatio...
Privacy Preserving Machine Learning: Related Work
A practical scenario of PPML is one where a single central party holds the entire dataset on which the ML algorithm has to be learned. Agrawal and Ramakrishnan [1] proposed the first method to learn a decision tree classifier on a database without revealing any information about individual records. They consider the public-model, private-data setting, where the algorithm and its parameters are public whereas ...
Privacy-Preserving Network Aggregation
Consider the scenario where information about a large network is distributed across several different parties (examples may include Facebook social networks or email communications networks). Intuitively, we would expect that the aggregate network formed by combining the individual private networks would be a more faithful representation of the underlying network as a whole. Thus, it would be u...
Interpretable Machine Learning for Privacy-Preserving IoT and Pervasive Systems
The presence of pervasive computing in our everyday lives and the emergence of the Internet of Things, such as the interaction of users with connected devices like smartphones or home appliances, generate increasing amounts of traces that reflect users' behavior. A plethora of machine learning techniques enable service providers to process these traces to extract latent information about the users. ...
Encrypted statistical machine learning: new privacy preserving methods
We present two new statistical machine learning methods designed to learn on fully homomorphic encrypted (FHE) data. The introduction of FHE schemes following Gentry (2009) opens up the prospect of privacy preserving statistical machine learning analysis and modelling of encrypted data without compromising security constraints. We propose tailored algorithms for applying extremely random forest...
Journal
Journal title: IEEE Transactions on Information Forensics and Security
Year: 2023
ISSN: 1556-6013, 1556-6021
DOI: https://doi.org/10.1109/tifs.2022.3163592